
Human-vehicle collaborative navigation and positioning technology optimization

Client
Harbin Institute of Technology
Capture volume
Application
Motion capture system, unmanned vehicle, collaborative navigation technology
Objects
Unmanned vehicle
Equipment used

Navigation technology plays a major role in daily life, and pedestrians are among its most common users. Pedestrian navigation systems are portable devices that provide directions and other navigation services. They can operate where satellite signals cannot reach, such as underground and in mines, as well as in areas with complex layouts, such as shopping malls. These systems typically rely on a micro inertial measurement unit (MIMU), which is prone to measurement error and limited accuracy, so researchers are developing ways to reduce that error.
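
Why MIMU error grows so quickly is easy to see in simulation: position comes from double-integrating acceleration, so even a tiny constant accelerometer bias produces a position error that grows quadratically with time. The numbers below (a 0.01 m/s² bias over 10 s) are illustrative assumptions, not values from the study:

```python
# Illustrative only: double-integrate a small constant accelerometer bias
# to show how dead-reckoning position error grows quadratically with time.
dt = 0.01        # integration step (s)
bias = 0.01      # assumed accelerometer bias (m/s^2)
v = 0.0          # accumulated velocity error (m/s)
p = 0.0          # accumulated position error (m)
for _ in range(int(10 / dt)):   # simulate 10 seconds
    v += bias * dt              # first integration: bias -> velocity error
    p += v * dt                 # second integration: velocity -> position error
print(p)  # about 0.5 m after only 10 s, matching 0.5 * bias * t**2
```

A bias one hundred times smaller still drifts without bound; this is why an external correction source is needed.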


One way to improve the accuracy of pedestrian navigation systems is collaborative navigation. In a collaborative navigation system, users share navigation information such as inter-node range measurements, node locations, and error estimates, strengthening the accuracy of the system as a whole.
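
As a concrete sketch of why shared range measurements help: given the accurately known positions of two collaborating nodes and UWB-style ranges to them, a rough dead-reckoned position can be refined by least squares. Everything below (the function name, the 2-D Gauss-Newton solver, and the example numbers) is an illustrative assumption, not the paper's algorithm:

```python
import math

def refine_position(prior, anchors, ranges, iters=10):
    """Gauss-Newton refinement of a 2-D position from ranges to known anchors."""
    x, y = prior
    for _ in range(iters):
        # residuals r_i = ||p - a_i|| - d_i and their 2-column Jacobian rows
        rows, res = [], []
        for (ax, ay), d in zip(anchors, ranges):
            dist = math.hypot(x - ax, y - ay)
            rows.append(((x - ax) / dist, (y - ay) / dist))
            res.append(dist - d)
        # normal equations (J^T J) delta = -J^T r, solved directly in 2x2 form
        a = sum(jx * jx for jx, _ in rows)
        b = sum(jx * jy for jx, jy in rows)
        c = sum(jy * jy for _, jy in rows)
        g0 = sum(jx * r for (jx, _), r in zip(rows, res))
        g1 = sum(jy * r for (_, jy), r in zip(rows, res))
        det = a * c - b * b
        if abs(det) < 1e-12:
            break  # degenerate geometry; keep the current estimate
        x += (-c * g0 + b * g1) / det
        y += (b * g0 - a * g1) / det
    return x, y

# Rough prior (2, 3); anchors at (0, 0) and (10, 0) with exact ranges
# to the true position (3, 4). The estimate converges near (3, 4).
x, y = refine_position((2.0, 3.0), [(0.0, 0.0), (10.0, 0.0)], [5.0, 65 ** 0.5])
```

The prior matters: with only two anchors the range circles intersect twice, and the Gauss-Newton iteration settles on the solution nearest the dead-reckoned starting point.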

Collaborative navigation systems commonly incorporate both pedestrian nodes and autonomous vehicle nodes. Autonomous vehicles are typically equipped with navigation hardware including cameras, radars, global navigation satellite system (GNSS) receivers, and high-precision inertial sensors, which enable far more accurate positioning. Sharing this information between autonomous vehicles and pedestrian navigation systems can therefore significantly strengthen the navigation accuracy of the pedestrian systems, and in turn the overall accuracy of the multi-user collaborative navigation system.

Researchers from the Harbin Institute of Technology have proposed a collaborative navigation system linking pedestrian and autonomous vehicle nodes based on factor graphs. They have also developed an optimization algorithm that fuses all of the received data into a consistent navigation solution.
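
In a factor graph, each measurement becomes a "factor" constraining one or more state variables, and navigation reduces to a joint least-squares problem over all factors. The following is a toy 1-D illustration of that idea, not the paper's formulation: two pedestrian positions are constrained by a prior, an IMU-derived step length, and a UWB range to a vehicle at an assumed known position.

```python
# Toy 1-D factor graph: states x[0], x[1] are pedestrian positions at two
# timesteps. Each factor returns a residual; we minimize the sum of squared
# residuals with simple numeric-gradient descent. All values are illustrative.
ANCHOR = 6.0  # assumed known position of a vehicle node

factors = [
    lambda x: x[0] - 0.0,                # prior: pedestrian starts at 0 m
    lambda x: (x[1] - x[0]) - 2.0,       # odometry: IMU reports a 2 m step
    lambda x: abs(ANCHOR - x[1]) - 4.0,  # range: UWB measures 4 m to vehicle
]

def total_error(x):
    return sum(f(x) ** 2 for f in factors)

def optimize(x, lr=0.05, iters=4000, eps=1e-6):
    x = list(x)
    for _ in range(iters):
        for i in range(len(x)):
            bumped = list(x)
            bumped[i] += eps
            # forward-difference numeric gradient of the total error
            grad = (total_error(bumped) - total_error(x)) / eps
            x[i] -= lr * grad
    return x

solution = optimize([0.5, 1.5])  # converges near x = [0.0, 2.0]
```

Real systems solve the same kind of problem with sparse Gauss-Newton or Levenberg-Marquardt over thousands of factors; the gradient descent here is only to keep the sketch self-contained.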


Overall architecture diagram of collaborative navigation system

Two nodes in the system are autonomous vehicles with high-accuracy navigation systems, while one follower node is a pedestrian navigation system. The pedestrian node is equipped with an inertial measurement unit based on micro-electromechanical system (MEMS) technology that measures the inertial data of the pedestrian's movement. The positioning information of the autonomous vehicle nodes relative to the navigation coordinate system is provided externally, and both the vehicle and pedestrian nodes carry ultra-wideband (UWB) ranging devices to measure the true distance between nodes.


UWB base station with reflective markers attached

The researchers tested the collaborative navigation system's factor graph algorithm through a walking navigation experiment. Two mobile UWB emitters were set up as pilot nodes in place of autonomous vehicles, and their real-time locations were recorded by a NOKOV motion capture system. The pedestrian navigation system was equipped with MIMU and UWB sensors, allowing it to measure the body's angular velocity and acceleration, as well as the distances between the pedestrian and the two pilot nodes.


NOKOV Motion Capture System

The subject walked a total of 49.7 meters over the course of the experiment. During the walk, the MIMU measured the angular velocity and acceleration of the subject's feet, while the UWB devices measured the distances between the subject's feet and the two pilot nodes. This data was processed by the collaborative navigation algorithm to compute the pedestrian's position. The sub-millimeter accuracy of the NOKOV motion capture system allowed the true locations and trajectories of the objects in the experiment to be recorded as ground truth.


Trajectory solved by the collaborative navigation algorithm

The validity and accuracy of the collaborative navigation algorithm were evaluated by comparing the pedestrian node trajectory calculated by the algorithm with the ground-truth trajectory recorded by the NOKOV motion capture system. The blue line represents the trajectory derived from the collaborative navigation algorithm, while the black line represents the real trajectory provided by the motion capture system. The researchers found that the end-point error of the collaborative navigation system was 0.0648 meters, an error rate of 0.13% of the relative walking distance.
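
The two reported figures are consistent with each other: an end-point error of 0.0648 m over the 49.7 m walk works out to roughly 0.13%.

```python
# Sanity check of the reported experimental figures.
endpoint_error_m = 0.0648   # end-point position error
distance_m = 49.7           # total walking distance
error_rate_pct = endpoint_error_m / distance_m * 100
print(f"{error_rate_pct:.2f}%")  # prints "0.13%"
```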

Bibliography:

[1] Huang Can. Human-vehicle collaborative navigation algorithm based on factor graph [D]. Harbin Institute of Technology, 2021.

